# Character-level BPE
## Zenz V2.5 Medium
A GPT-2-architecture conditional language model designed specifically for Japanese kana-kanji conversion, with support for context-aware conversion.
- Tags: Large Language Model, Japanese
- Author: Miwa-Keita
- Downloads: 25 · Likes: 4
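The "character-level BPE" heading refers to byte-pair encoding whose base vocabulary is individual characters rather than bytes or whitespace-split words, which suits Japanese text with no word boundaries. Below is a minimal, generic sketch of learning BPE merges from a character base vocabulary; it is an illustration of the technique only, not the zenz tokenizer itself, and all function names are my own.

```python
from collections import Counter

def most_frequent_pair(words):
    """Count adjacent symbol pairs across the corpus and return the most frequent one."""
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(words, pair):
    """Replace every occurrence of `pair` with its concatenation."""
    a, b = pair
    merged = Counter()
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] += freq
    return merged

def train_char_bpe(corpus, num_merges):
    """Learn BPE merges starting from single characters (the character-level base vocabulary)."""
    words = Counter(tuple(word) for word in corpus)  # each word starts as a character sequence
    merges = []
    for _ in range(num_merges):
        pair = most_frequent_pair(words)
        if pair is None:
            break
        merges.append(pair)
        words = merge_pair(words, pair)
    return merges
```

For example, on the toy corpus `["かんじ", "かんじ", "へんかん"]` the first learned merge is `("か", "ん")`, since that character pair occurs most often.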
## HerBERT Base Cased
HerBERT is a Polish pre-trained language model based on the BERT architecture, trained with dynamic whole-word masking and a sentence-structure objective.
- Tags: Large Language Model, Other
- Author: allegro
- Downloads: 84.18k · Likes: 17
## HerBERT Large Cased
The large variant of HerBERT, a Polish pre-trained language model based on the BERT architecture, trained with dynamic whole-word masking and a sentence-structure objective.
- Tags: Large Language Model, Other
- Author: allegro
- Downloads: 1,272 · Likes: 6